Smoothing methods for convex inequalities and linear complementarity problems
Authors
Abstract
A smooth approximation p(x, α) to the plus function max{x, 0} is obtained by integrating the sigmoid function 1/(1 + e^(−αx)), commonly used in neural networks. By means of this approximation, linear and convex inequalities are converted into smooth, convex unconstrained minimization problems, the solution of which approximates the solution of the original problem to a high degree of accuracy for sufficiently large α. In the special case when a Slater constraint qualification is satisfied, an exact solution can be obtained for finite α. Speedup over MINOS 5.4 was as high as 1142 times for linear inequalities of size 2000 × 1000, and 580 times for convex inequalities with 400 variables. Linear complementarity problems are converted into a system of smooth nonlinear equations and are solved by a quadratically convergent Newton method. For monotone LCPs with as many as 10,000 variables, the proposed approach was as much as 63 times faster than Lemke's method.
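The following is a minimal numerical sketch, not the authors' code, of the two reformulations described in the abstract. It uses the smooth plus function p(x, α) = x + (1/α) log(1 + e^(−αx)) obtained by integrating the sigmoid 1/(1 + e^(−αx)), turns a strictly feasible system Ax ≤ b into the unconstrained minimization of (1/2)||p(Ax − b, α)||², and rewrites the LCP 0 ≤ x ⊥ Mx + q ≥ 0 via min(x, Mx + q) = x − (x − Mx − q)_+ as a smooth nonlinear system solved by Newton's method. The random problem data, the value α = 1000, the L-BFGS solver, and the backtracking safeguard are illustrative assumptions, not details taken from the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def p(x, a):
    # smooth plus function: x + (1/a)*log(1 + exp(-a*x)), written in the
    # numerically stable equivalent form (1/a)*log(1 + exp(a*x))
    return np.logaddexp(0.0, a * x) / a

def dp(x, a):
    # its derivative is the sigmoid 1/(1 + exp(-a*x))
    return expit(a * x)

# --- linear inequalities Ax <= b via smooth unconstrained minimization ---
rng = np.random.default_rng(0)
m, n, a = 60, 30, 1000.0            # illustrative sizes and smoothing parameter
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n) + rng.uniform(0.1, 1.0, m)   # strictly feasible system

def f(x):                           # 0.5 * || p(Ax - b, a) ||^2
    r = p(A @ x - b, a)
    return 0.5 * r @ r

def grad(x):
    r = A @ x - b
    return A.T @ (p(r, a) * dp(r, a))

sol = minimize(f, np.zeros(n), jac=grad, method="L-BFGS-B")
print("max violation of Ax <= b:", np.max(A @ sol.x - b))

# --- LCP(q, M): 0 <= x, Mx + q >= 0, x'(Mx + q) = 0 ---
# min(x, Mx + q) = x - (x - Mx - q)_+ ; replacing (.)_+ by p(., a) gives a smooth system
N = 50
B = rng.standard_normal((N, N))
M = B @ B.T + np.eye(N)             # symmetric positive definite => monotone LCP
q = rng.standard_normal(N)

def residual(x):
    return x - p(x - (M @ x + q), a)

x = np.zeros(N)
for _ in range(100):                # Newton iteration with a simple backtracking safeguard
    Phi = residual(x)
    nrm = np.linalg.norm(Phi)
    if nrm < 1e-10:
        break
    s = x - (M @ x + q)
    J = np.eye(N) - np.diag(dp(s, a)) @ (np.eye(N) - M)
    d = np.linalg.solve(J, -Phi)
    t = 1.0
    while np.linalg.norm(residual(x + t * d)) >= nrm and t > 1e-12:
        t *= 0.5
    x = x + t * d

w = M @ x + q
print("complementarity x.w:", x @ w, " min(x):", x.min(), " min(Mx+q):", w.min())

As α grows, p(·, α) approaches the plus function uniformly (the gap is at most (log 2)/α, attained at x = 0), which is why a sufficiently large α recovers the original problem to high accuracy.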
Related papers
A new look at smoothing Newton methods for nonlinear complementarity problems and box constrained variational inequalities
In this paper we take a new look at smoothing Newton methods for solving the nonlinear complementarity problem (NCP) and the box constrained variational inequalities (BVI). Instead of using an infinite sequence of smoothing approximation functions, we use a single smoothing approximation function and Robinson’s normal equation to reformulate NCP and BVI as an equivalent nonsmooth equation H(u, ...
Some Fundamental Properties of Successive Convex Relaxation Methods on LCP and Related Problems
General Successive Convex Relaxation Methods (SCRMs) can be used to compute the convex hull of any compact set, in a Euclidean space, described by a system of quadratic inequalities and a compact convex set. Linear Complementarity Problems (LCPs) make an interesting and rich class of structured nonconvex optimization problems. In this paper, we study a few of the specialized lift-and-project m...
An interior point potential reduction method for constrained equations
We study the problem of solving a constrained system of nonlinear equations by a combination of the classical damped Newton method for (unconstrained) smooth equations and the recent interior point potential reduction methods for linear programs, linear and nonlinear complementarity problems. In general, constrained equations provide a unified formulation for many mathematical programming probl...
Smoothing Projected Gradient Method and Its Application to Stochastic Linear Complementarity Problems
A smoothing projected gradient (SPG) method is proposed for the minimization problem on a closed convex set, where the objective function is locally Lipschitz continuous but nonconvex and nondifferentiable. We show that any accumulation point generated by the SPG method is a stationary point associated with the smoothing function used in the method, which is a Clarke stationary point in many appli...
Cut-Generating Functions and S-Free Sets
We consider the separation problem for sets X that are inverse images of a given set S by a linear mapping. Classical examples occur in integer programming, complementarity problems and other optimization problems. One would like to generate valid inequalities that cut off some point not lying in X, without reference to the linear mapping. Formulas for such inequalities can be obtained through...
Journal: Math. Program.
Volume: 71, Issue: -
Pages: -
Publication year: 1995